Someone mentioned to me today that Bruce Clay has done a great job of optimizing his site over the past year and has reclaimed some of those top positions for broad-level SEO terms. One of these is “search engine optimization,” where Bruce ranks in the 2nd spot:
When I analyze why a site/page is ranking well, I’m typically looking at:
- Page Strength, which is good for a quick overview – 6.5
- Links to the domain via Y! Site Explorer – Bruce has 41,314
- Number of unique domains among the top 200–300 links – I estimate around 50% of the top 250 are from unique domains (see the sketch after this list)
- Ratio of high- to low-quality links – usually I do this via manual review
- Who’s linking and why (natural, manipulated, paid, etc.) – again, manual review required
- Anchor text (the neat-o tool is a must for this; see also the sketch below) – lots of instances of “search engine optimization,” which surely helps 🙂
- Links to the page from Bruce’s site – 1,070
- Links to the page from external sites – 7,757
- Links that Technorati knows about – 1,206
- How frequently new blog links appear (just look at the included timeline next to the links)
- What authority levels those links carry (sort by authority to see)
- A bit of keyword usage data about the page – the Ranks.nl tool is still the best
- Google PageRank of the domain and page (not a great one, but it’s still a signal) – Page = 6, Domain = 7
- How it ranks at Ask.com (with their picky local-popularity algorithm) – 5
- The strength of who else is ranking in the top 10 (WBP’s cool SEO tool is good for this, as is SEOmoz’s own KW Difficulty)
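For the unique-domain and anchor-text checks above, here’s a minimal sketch of how you might run them yourself over an exported backlink list. Everything in it is illustrative: the backlink data is made up, and the `domain()` helper is a crude stand-in for a proper public-suffix-aware parser, not the method any of the tools above actually use.

```python
from collections import Counter
from urllib.parse import urlparse

# (source_url, anchor_text) pairs -- hypothetical stand-in for an export
# from a tool like Y! Site Explorer
backlinks = [
    ("http://www.example-blog.com/seo-tips", "search engine optimization"),
    ("http://forum.example.org/thread/42", "Bruce Clay"),
    ("http://www.example-blog.com/tools", "SEO tools"),
    ("http://news.example.net/story", "search engine optimization"),
]

def domain(url):
    """Crude registrable-domain guess: the host minus a leading 'www.'.
    A real audit would use a public-suffix-aware parser instead."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

# Share of unique linking domains among the top N links
top_n = backlinks[:250]
unique = {domain(url) for url, _ in top_n}
print(f"{len(unique)} unique domains in top {len(top_n)} links "
      f"({100 * len(unique) / len(top_n):.0f}% unique)")

# Anchor-text distribution: heavy repetition of one money phrase is the
# kind of pattern a manual review would flag
for text, count in Counter(text for _, text in backlinks).most_common(5):
    print(f"{count:4d}  {text}")
```

Run over a few hundred exported links, this gives you the rough unique-domain percentage and the anchor-text distribution in one pass, so the manual review can start with the outliers.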
All in all, Bruce’s page is a solid competitor with a lot of strong factors backing it up. My only concern with the page as it stands now would be its effectiveness over time in attracting new, natural links. If you look over the page from a human perspective, my guess is that very few folks would be likely to send links to that page after visiting – it’s more sales and navigation than resource.
What’s your strategy for investigating a site’s ranking ability?